Instant messaging implications in the transition from a private consumer activity to a communication tool for business
Used to communicate important information and maintain contact outside of lectures to great effect, email has emerged as the communication medium of choice. However, Instant Messaging, with its interactive nature and appealing user interfaces, is fast emerging as a popular worldwide method of communication. Can it be as effective within the organisation, and more importantly the university, as the now-established email? This research paper explores how Instant Messaging is advancing from a private consumer activity to a tool capable of improving communication within universities and organisations.
Through a controlled experiment, the paper examines the way in which Instant Messaging is being adopted and how it compares with other forms of communication, including its associated interrupt recovery time. The paper also examines the extent to which Instant Messaging disrupts users and how users reacted to new messages under different notification settings.
Theory-based model of factors affecting information overload
As the volume of available information increases, individuals and organisations become overwhelmed by the plethora of information. This can reduce productivity and performance, hinder learning and innovation, affect decision-making and well-being, and cost organisations large amounts of money. This paper develops a new theory-based model of factors affecting information overload and provides a formula for calculating the extent of overload, potentially of use as a diagnostic tool supporting individual or organisational development.
Two methods for categorising factors that contribute to the overload phenomenon are introduced. The first method separates factors by their direct or indirect effect on information overload, calling them intrinsic and extraneous factors respectively. The second method divides factors by whether they increase or decrease the likelihood of information overload, tagging them ‘For’ or ‘Against’. These new methods of categorisation not only assisted in developing the conceptual model and creating the formula, but could also be used in other aspects of information overload research, such as finding and evaluating countermeasures to information overload. The model and the formula presented in the paper provide a significant contribution to the information overload body of research.
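The paper's own formula is not reproduced in this abstract; the ‘For’/‘Against’ categorisation can nevertheless be illustrated with a purely hypothetical weighted score, where the factor names, weights and ratings below are invented for demonstration only:

```python
# Hypothetical illustration of the 'For'/'Against' factor categorisation.
# The factor names, weights and ratings are assumptions for demonstration;
# the paper's actual formula is not reproduced here.

def overload_score(factors):
    """Sum weighted factor ratings: 'For' factors raise the likelihood
    of overload, 'Against' factors lower it."""
    score = 0.0
    for name, (direction, weight, rating) in factors.items():
        contribution = weight * rating
        score += contribution if direction == "for" else -contribution
    return score

factors = {
    # name: (direction, weight 0-1, rating 0-10)
    "incoming email volume":  ("for",     0.9, 8),
    "task interruptions":     ("for",     0.7, 6),
    "filtering tools in use": ("against", 0.6, 5),
    "information literacy":   ("against", 0.5, 7),
}

print(round(overload_score(factors), 2))  # → 4.9
```

A positive score indicates that the ‘For’ factors dominate, i.e. overload is more likely for this individual or organisation.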
Capturing and managing email knowledge
In many successful organisations today, significant resources are
invested in training and development efforts exploring group
dynamics and effective team building. The challenge from a
knowledge management perspective is to explore how technology
could facilitate knowledge sharing (both tacit and explicit) in a
group context. The paper highlights the benefits of developing such
Knowledge Management tools to make better use of the
information contained within email messages, and shows how
organisations could become more effective by adopting such an
application.
The design and evaluation of EKE, a semi-automated email knowledge extraction tool
This paper presents an approach to locating experts within organisations through the use of the indispensable communication medium and source of information, email. The approach was realised through the email expert locator architecture developed by the authors, which uses email content in the modelling of individuals' expertise profiles. The approach has been applied to a real-world application, EKE, and evaluated using focus group sessions and system trials. In this work, the authors report the findings obtained from the focus group sessions. The aim of the sessions was to obtain information about the participants' perceptions, opinions, underlying attitudes, and recommendations with regard to the notion of exploiting email content for expertise profiling. The paper provides a review of the various approaches to expertise location that have been developed and highlights the end-users' perspectives on the usability and functionality of EKE and the socio-ethical challenges raised by its adoption from an industrial perspective. © 2012 Operational Research Society. All rights reserved.
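The core idea of expertise profiling from email content can be sketched in miniature: aggregate a term profile per person from their email bodies, then rank people against a query term. This is a generic illustration, not the EKE implementation, and the names and messages below are invented examples:

```python
# Illustrative sketch (not the EKE implementation): building a simple
# term-frequency expertise profile from a person's email, then ranking
# people against a query term. Names and messages are invented.
from collections import Counter
import re

def profile(emails):
    """Aggregate a term-frequency profile over a list of email bodies."""
    words = []
    for body in emails:
        words += re.findall(r"[a-z]+", body.lower())
    return Counter(words)

def rank_experts(profiles, term):
    """Order people by how often the query term appears in their email."""
    return sorted(profiles, key=lambda p: profiles[p][term], reverse=True)

profiles = {
    "alice": profile(["The ontology mapping is done", "ontology review attached"]),
    "bob":   profile(["Fuel cell test rig booked", "cell voltage logs attached"]),
}
print(rank_experts(profiles, "ontology"))  # → ['alice', 'bob']
```

A production system would weight terms (e.g. TF-IDF), filter signatures and quoted text, and address the socio-ethical concerns the focus groups raised about mining personal email.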
Investigation of polymer electrolyte membrane fuel cell internal behaviour during long term operation and its use in prognostics
This paper investigates the variation in polymer electrolyte membrane (PEM) fuel cell internal behaviour at different operating conditions, using characterization test data taken at predefined inspection times, and uses the determined evolution of internal behaviour to predict future PEM fuel cell performance. For this purpose, a PEM fuel cell behaviour model is used whose parameters can be related to the various fuel cell losses. By fitting the model to polarization curves collected from the PEM fuel cell system, the variation of fuel cell internal behaviour can be obtained through the determined model parameters. From the results, the sources of PEM fuel cell degradation over its lifetime at different conditions can be better understood. Moreover, with the fuel cell internal behaviour determined, future fuel cell performance can be obtained by predicting the future model parameters. Compared with prognostic results from an adaptive neuro-fuzzy inference system (ANFIS), the proposed prognostic analysis provides better predictions of PEM fuel cell performance under dynamic conditions, and with an understanding of the variation in PEM fuel cell internal behaviour, mitigation strategies can be designed to extend fuel cell performance.
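The abstract does not specify the behaviour model, but the fitting step can be illustrated with a common semi-empirical polarization model, V(i) = E0 − b·log10(i) − R·i (activation plus ohmic losses), which is linear in its parameters and can be recovered by least squares. The data below are synthetic:

```python
# Illustrative sketch: fitting a common semi-empirical polarization model
#   V(i) = E0 - b*log10(i) - R*i
# to (current density, voltage) points by linear least squares. This is
# not necessarily the paper's behaviour model; the data are synthetic.
import numpy as np

i = np.array([0.05, 0.1, 0.2, 0.4, 0.6, 0.8])    # current density, A/cm^2
v = 0.95 - 0.06 * np.log10(i) - 0.25 * i          # synthetic 'measurements'

# Design matrix: one column per parameter (E0, b, R)
A = np.column_stack([np.ones_like(i), -np.log10(i), -i])
E0, b, R = np.linalg.lstsq(A, v, rcond=None)[0]

print(round(E0, 3), round(b, 3), round(R, 3))  # recovers 0.95 0.06 0.25
```

Tracking how the fitted parameters drift across inspection times is what exposes the degradation trend: a rising R, for example, would point to growing ohmic losses.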
Swarm computational intelligence design for a high integrity protection system
Search meta-heuristic procedures that mimic the process of biological natural selection are an embedded part of artificial intelligence (AI). They are regularly used to solve optimization problems such as minimizing the occurrence of disastrous events in industry. Extra precautions are taken for people and equipment operating in hazardous and harsh environments; thus safety systems are designed to give the required, accurate, necessary and timely protection. There is hence the need to drastically reduce the probability of a system failure occurring. A High Integrity Protection System (HIPS) is a safety device which could be installed on offshore facilities with the objective of mitigating a high-pressure upsurge that has the potential to cause immense harm and subsequently destroy the system. The aim of the research is to use a Particle Swarm Optimization (PSO) approach to intelligently design the system in order to optimize and reduce the unavailability of the HIPS design. A Fault Tree Analysis (FTA) model is employed to build the HIPS structure. FTA is a top-down approach using Boolean logic operations to analyse causes, investigate potential and likely faults, and quantify their contribution to system failure during product design. A comparison is made between this HIPS-PSO approach and previous work performed using a genetic algorithm (GA). In addition to the simplicity of the HIPS-PSO design, a much faster execution time and a reduced system unavailability were obtained when compared with the GA approach.
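The optimization loop can be sketched with a minimal PSO. This is a generic textbook PSO, not the authors' HIPS-PSO design, and the objective below is a toy stand-in: in the real work the objective is the system unavailability computed from the fault tree over the HIPS design variables.

```python
# Minimal PSO sketch (not the authors' HIPS-PSO design). The objective is
# a toy stand-in; the real work minimises system unavailability computed
# from a fault tree of the HIPS design variables.
import random

def unavailability(x):
    """Toy stand-in objective with a known minimum at x = (1, 2)."""
    return (x[0] - 1.0) ** 2 + (x[1] - 2.0) ** 2

def pso(objective, dim=2, swarm=20, iters=200, lo=-5.0, hi=5.0):
    random.seed(0)  # deterministic for the demonstration
    pos = [[random.uniform(lo, hi) for _ in range(dim)] for _ in range(swarm)]
    vel = [[0.0] * dim for _ in range(swarm)]
    pbest = [p[:] for p in pos]                 # personal bests
    gbest = min(pbest, key=objective)           # global best
    for _ in range(iters):
        for k in range(swarm):
            for d in range(dim):
                r1, r2 = random.random(), random.random()
                vel[k][d] = (0.7 * vel[k][d]                      # inertia
                             + 1.5 * r1 * (pbest[k][d] - pos[k][d])  # cognitive
                             + 1.5 * r2 * (gbest[d] - pos[k][d]))    # social
                pos[k][d] += vel[k][d]
            if objective(pos[k]) < objective(pbest[k]):
                pbest[k] = pos[k][:]
        gbest = min(pbest, key=objective)
    return gbest

best = pso(unavailability)
print([round(x, 2) for x in best])  # converges near [1.0, 2.0]
```

In the HIPS setting each particle would encode discrete design choices (component redundancy, test intervals), with the fault tree evaluated at each position.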
Harvesting information from the Internet to construct ontologies
The paper evaluates the effectiveness of harvesting information from the internet to aid the low-cost construction of an ontology. It describes how a proof-of-concept called OntoRanch was built to harvest information and its relationships in order to construct an ontology. A systems development methodology was adopted which recognises three main stages: concept development, system building, and system evaluation. The evaluation took an interpretive hybrid approach, using both a focus group and a questionnaire to evaluate the proof-of-concept OntoRanch. The findings show that reusing information by harvesting it from the internet can provide an effective, self-sustaining process that enables ontologies to be constructed in less time and at lower cost.
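One common way such harvesting works is extracting is-a relations from web text with Hearst-style lexical patterns ("X such as Y"). This is a generic technique shown for illustration; the abstract does not state that OntoRanch uses this particular method.

```python
# Illustrative sketch of harvesting is-a relations from text using a
# Hearst-style lexical pattern ("X such as Y, Z and W"). A generic
# technique, not necessarily the method OntoRanch implements.
import re

PATTERN = re.compile(r"(\w+(?: \w+)?) such as ((?:\w+(?:, )?)+(?: and \w+)?)")

def harvest(text):
    """Return (hypernym, hyponym) pairs found via the pattern."""
    pairs = []
    for hyper, hypos in PATTERN.findall(text):
        for hypo in re.split(r", | and ", hypos):
            pairs.append((hyper.lower(), hypo.lower()))
    return pairs

text = "Livestock such as cattle, sheep and goats are kept on the ranch."
print(harvest(text))
# → [('livestock', 'cattle'), ('livestock', 'sheep'), ('livestock', 'goats')]
```

Each extracted pair becomes a candidate subclass relation in the ontology, which is why human evaluation (the focus group and questionnaire) matters: pattern matches from the open web need vetting before they are accepted.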
An intelligent novel tripartite - (PSO-GA-SA) optimization strategy
A solution approach for many challenging and non-differentiable optimization tasks in industry is the use of non-deterministic meta-heuristic methods. These approaches include Particle Swarm Optimization (PSO), Genetic Algorithm (GA), and Simulated Annealing (SA). However, even with these robust and stochastic optimization approaches, some predominant issues remain, such as the potential solution becoming trapped in local minima. Other challenges include premature convergence and a slow rate of arriving at optimal solutions. In this research study, a tripartite version (PSO-GA-SA) is proposed to address these deficiencies. The algorithm is designed to fully exploit the capabilities of PSO, GA and SA functioning simultaneously, using a high level of intelligent system techniques to exploit and exchange relevant population traits in real time without compromising computational time. The algorithm further incorporates a variable velocity component that introduces random intelligence depending on the fitness performance from one generation to the next. The design is validated with known mathematical test function models. Substantial performance improvements are observed when the novel PSO-GA-SA approach is applied to three test functions used as case studies. The results indicate that the new approach performs better than the individual methods in terms of fitness function deviation and total simulation time, whilst operating with both a reduced number of generations and a reduced population. Moreover, the new approach offers a more beneficial trade-off between the exploration and exploitation behaviours of PSO, GA and SA.
This novel design is implemented using an object-oriented programming approach and is expected to be compatible with a variety of practical problems with specified input-output pairs, coupled with constraints and limitations on the available resources.
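The tripartite idea can be sketched as one loop that applies all three operators per generation. This is an assumed structure for illustration, not the authors' exact algorithm: a PSO velocity step pulls candidates toward the global best (exploitation), a GA-style mutation injects diversity, and an SA acceptance rule occasionally keeps worse candidates to escape local minima (exploration), with a cooling temperature tightening acceptance over time.

```python
# Compact sketch of the tripartite idea (assumed structure, not the
# authors' exact algorithm): each generation applies a PSO velocity step,
# a GA-style mutation, and an SA acceptance rule for worse candidates.
import math
import random

def f(x):  # toy objective, minimum at x = 0
    return x * x

random.seed(1)
pop = [random.uniform(-10, 10) for _ in range(10)]
vel = [0.0] * 10
best = min(pop, key=f)
temp = 1.0
for gen in range(300):
    for k in range(len(pop)):
        # PSO: pull each candidate toward the global best
        vel[k] = 0.7 * vel[k] + 1.5 * random.random() * (best - pop[k])
        cand = pop[k] + vel[k]
        # GA: small random mutation
        cand += random.gauss(0, 0.1)
        # SA: accept worse candidates with temperature-dependent probability
        if (f(cand) < f(pop[k])
                or random.random() < math.exp((f(pop[k]) - f(cand)) / temp)):
            pop[k] = cand
    best = min(pop + [best], key=f)
    temp *= 0.99  # cooling schedule
print(round(best, 3))  # settles near 0.0
```

The paper's variable velocity component and real-time trait exchange between the three sub-populations go beyond this sketch, but the division of labour (PSO exploiting, GA diversifying, SA gating acceptance) is the essence of the hybrid.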
Deploying knowledge management and securing future sponsorship within a highly hierarchical 'role-based' organisational culture
This paper describes the latest research in Knowledge Management (KM) that is being carried out at the
headquarters of The Danwood Group, Lincoln, UK as part of a collaborative doctoral research initiative with
the Department of Computer Science, Loughborough University. The primary aim of this project is to develop a
practical, business-oriented approach to managing knowledge within an organisation.
The four key areas that have been recognised as critical to the success of a KM scheme are ‘strategy’, ‘technology’, ‘measurement’ and ‘culture’. The latter is explored in this paper, where the authors give an insight into the difficulties of deploying KM in an organisation characterised by its traditional ‘role-based’ culture and highly hierarchical management structure.
Performing research in a commercial environment instantly highlights the need for practical outcomes. Danwood, like many companies that invest in research projects, wants to see tangible business-term results. However, the benefits of KM are hard to demonstrate on such strict timescales, which can inherently result in growing resistance from top-level management towards further investment.
This paper suggests that ‘KM activities’ should be labelled with terms that are relevant and comprehensible to the organisation, and progressively integrated with mission-critical business processes that will generate faster bottom-line results. The aim is to establish KM as the source of these benefits and so secure future investment in an ‘official’ KM scheme. The additional advantage of this approach is that it accommodates a cautious, moderately paced adaptation of KM techniques to the particular organisational culture.
Reducing the effect of email interruption on employees
It is generally assumed that, because it is not necessary to react to email messages when they arrive, employees will read their messages in their own time with minimum interruption to their work. This research has shown that email messages do have some disruptive effect by interrupting the user. Employees at the Danwood Group in the UK were monitored to see how they used email. It was found that most employees had their email software check for incoming messages every 5 minutes and responded to the arrival of a message within 6 seconds. A recovery time between finishing reading the email and returning to normal work also existed, though it was shorter than published recovery times for a telephone interrupt.
This analysis suggests that a number of methods can be employed to reduce this interrupt effect. Employee training, changing the settings and modes of use of the email software, and the introduction of a one-line email facility are all shown to have beneficial effects. This has led to a series of recommendations that will enable the Danwood Group to make better use of email communication and increase employee effectiveness.